

Search for: All records

Creators/Authors contains: "Palaguachi, Chris"

Note: Clicking a Digital Object Identifier (DOI) link will take you to an external site maintained by the publisher. Some full-text articles may not be freely available during the embargo period.

Some links on this page may take you to non-federal websites, whose policies may differ from this site's.

  1. Systematic reviews are a time-consuming yet effective approach to understanding research trends. While researchers have investigated how to speed up the screening of studies for potential inclusion, few have examined the extent to which algorithms can extract data in place of human coders. In this study, we explore to what extent analyses and algorithms can produce results similar to human data extraction during a scoping review (a type of systematic review aimed at understanding the nature of a field rather than the efficacy of an intervention), in the context of a never-before-analyzed sample of studies intended for a scoping review. Specifically, we tested five approaches: bibliometric analysis with VOSviewer; latent Dirichlet allocation (LDA) with bag-of-words features; k-means clustering with TF-IDF, Sentence-BERT, or SPECTER embeddings; hierarchical clustering with Sentence-BERT; and BERTopic. Our results showed that the topic modeling approaches (LDA and BERTopic) and k-means clustering identified specific but often narrow research areas, leaving a substantial portion of the sample unclassified or assigned to unclear topics. Meanwhile, bibliometric analysis and hierarchical clustering with Sentence-BERT were more informative for our purposes: the former identified key author networks, while the latter categorized studies into distinct themes and reflected the relationships between those themes. Overall, we highlight the capabilities and limitations of each method and discuss how these techniques can complement traditional human data extraction. We conclude that the analyses tested here likely cannot fully replace human data extraction in scoping reviews but serve as valuable supplements.
    Free, publicly-accessible full text available December 1, 2025
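One of the approaches the abstract names, k-means clustering over TF-IDF vectors of study texts, can be sketched in a few lines with scikit-learn. This is a minimal illustration only: the toy abstracts, the choice of k=2, and the fixed random seed are assumptions for the example, not the paper's actual data or settings.

```python
# Sketch of k-means clustering with TF-IDF, one of the five approaches
# described in the abstract. Toy data; not the study's corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Four stand-in "abstracts" forming two obvious topical groups.
abstracts = [
    "Eye tracking reveals visual attention during multimedia learning.",
    "Eye tracking and visual attention in video lectures.",
    "Topic modeling of forum posts with latent Dirichlet allocation.",
    "Latent Dirichlet allocation topic models for discussion posts.",
]

# Represent each abstract as a TF-IDF vector, then cluster with k-means.
vectors = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print(labels)
```

In a real scoping-review setting, the cluster labels would then be inspected by a human coder to judge whether the groupings correspond to meaningful research areas, which is the kind of supplementary role the study concludes these methods can play.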